BFGS Method: A New Search Direction
Authors
Abstract
In this paper we present a new line search method, the HBFGS method, which combines the search direction of the conjugate gradient method with quasi-Newton updates. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is used as the approximation of the Hessian. The new algorithm is compared with the standard BFGS method in terms of iteration count and CPU time. Our numerical experiments provide strong evidence that the proposed HBFGS method is more efficient than the ordinary BFGS method. In addition, we prove that the new algorithm is globally convergent.
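The abstract states only that the HBFGS direction couples the conjugate gradient search direction with BFGS quasi-Newton updates, without giving the exact formula. The Python sketch below illustrates one way such a coupling can look: the quasi-Newton step -Hg is augmented with a Fletcher-Reeves conjugate-gradient term. The combination rule, the Fletcher-Reeves coefficient, and the Armijo backtracking line search are all assumptions made here for illustration, not the paper's HBFGS construction.

    import numpy as np

    def armijo_backtracking(f, x, g, d, alpha=1.0, c1=1e-4, shrink=0.5):
        """Plain Armijo backtracking line search (an assumption here; the
        paper may impose different line search conditions)."""
        fx, slope = f(x), c1 * (g @ d)
        while f(x + alpha * d) > fx + alpha * slope and alpha > 1e-12:
            alpha *= shrink
        return alpha

    def hbfgs_sketch(f, grad, x0, tol=1e-6, max_iter=500):
        """Hybrid direction: the quasi-Newton step -H g plus a
        conjugate-gradient term beta * d_prev (Fletcher-Reeves beta, an
        assumption). H is maintained by the standard BFGS inverse update."""
        x = np.asarray(x0, dtype=float)
        n = x.size
        H = np.eye(n)                      # inverse-Hessian approximation
        g = grad(x)
        g_prev, d_prev = g, -g
        for k in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            beta = (g @ g) / (g_prev @ g_prev) if k > 0 else 0.0
            d = -H @ g + beta * d_prev     # hybrid quasi-Newton / CG direction
            if g @ d >= 0:                 # safeguard: ensure a descent direction
                d = -H @ g
            alpha = armijo_backtracking(f, x, g, d)
            s = alpha * d
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g
            ys = y @ s
            if ys > 1e-10:                 # curvature guard keeps H positive definite
                rho = 1.0 / ys
                I = np.eye(n)
                H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                    + rho * np.outer(s, s)
            x, g_prev, g, d_prev = x_new, g, g_new, d
        return x

    # Example: minimize the two-dimensional Rosenbrock function.
    if __name__ == "__main__":
        f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
        grad = lambda x: np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])
        print(hbfgs_sketch(f, grad, np.array([-1.2, 1.0])))  # approaches [1, 1]

The curvature guard y·s > 0 is the standard condition under which the BFGS update preserves positive definiteness of H; hybrid schemes typically add such safeguards so that the combined direction remains a descent direction, which is also the kind of property that global convergence proofs rely on.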
Similar Articles
Modifying the line search formula in the BFGS method to achieve global convergence
A Line Search Multigrid Method for Large-Scale Nonlinear Optimization
Abstract. We present a line search multigrid method for solving discretized versions of general unconstrained infinite-dimensional optimization problems. At each iteration on each level, the algorithm computes either a “direct search” direction on the current level or a “recursive search” direction from coarser level models. Introducing a new condition that must be satisfied by a backtracking line search …
New Quasi-Newton Optimization Methods for Machine Learning
This thesis develops new quasi-Newton optimization methods that exploit the well-structured functional form of objective functions often encountered in machine learning, while still maintaining the solid foundation of the standard BFGS quasi-Newton method. In particular, our algorithms are tailored for two categories of machine learning problems: (1) regularized risk minimization problems with c…
A Quasi-Newton Approach to Nonsmooth Convex Optimization Problems in Machine Learning
We extend the well-known BFGS quasi-Newton method and its limited-memory variant LBFGS to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We prove that under some technical conditions, the re…
A Quasi-Newton Approach to Nonsmooth Convex Optimization
We extend the well-known BFGS quasi-Newton method and its limited-memory variant (LBFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting sub(L)BFGS algorithm to L2-regularized …